aterm-dbi/query
. The query service is an ATermService. It accepts a Query term in the body of an HTTP POST request and returns the result of the query in the body of the response.
For example, this query:

Query("SELECT ID, Name FROM Customer")

might result in this ResultSet:
ResultSet(
  MetaData(
    Columns(
      [ Column(Name("id"), Type(Std("INTEGER"), Db("int4")))
      , Column(Name("name"), Type(Std("CHAR"), Db("bpchar")))
      ]
    )
  )
, Data(
    [ Row([160, "Meindert Kroese"])
    , Row([159, "Steven van Dijk"])
    , Row([158, "Martijn Schrage"])
    , Row([157, "Sandor Spruit"])
    ]
  )
)

The ATerm Database Interface rewrites values of SQL types to an ATerm representation. The current implementation only handles integer types (INTEGER, SMALLINT, and BIGINT) and string types (CHAR, VARCHAR, and LONGVARCHAR). This will be extended in the future to cover all SQL data types.
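Since the service is plain HTTP, it can be invoked from any client. Below is a minimal Python sketch of such a client, assuming the service is deployed at the URL used in the invocation example in this document; the helper names (make_query_term, run_query) are hypothetical, not part of aterm-dbi:

```python
import urllib.request

# Hypothetical deployment URL; matches the example invocation in this document.
SERVICE_URL = "http://127.0.0.1:8080/aterm-dbi/query"

def make_query_term(sql):
    """Build the textual (TAF) Query term; backslashes and quotes in the SQL are escaped."""
    escaped = sql.replace('\\', '\\\\').replace('"', '\\"')
    return 'Query("%s")' % escaped

def run_query(sql, url=SERVICE_URL):
    """POST the Query term to the service and return the ResultSet term as text."""
    req = urllib.request.Request(url, data=make_query_term(sql).encode("ascii"))
    with urllib.request.urlopen(req) as response:
        return response.read().decode("ascii")
```

For example, run_query("SELECT ID, Name FROM Customer") would send the Query term shown above and return the textual ResultSet term.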
http-transform
strategy of Stratego Networking.
There is one problem: the Java implementation of the ATerm library doesn't provide a reader for BAF, the Binary ATerm Format. Because this format is very efficient, it is used by XTC by default. You should therefore invoke the services of aterm-dbi in TAF, the Textual ATerm Format. You can do this with the following strategy:
http-text-transform(service-url) =
  xtc-temp-files(
    write-to-text
  ; xtc-http-transform(service-url)
  ; read-from
  )

Invocation of the query service of aterm-dbi:
<http-text-transform(!URL("http://127.0.0.1:8080/aterm-dbi/query"))> Query("SELECT ID, Name FROM Customer")
war
file is a Java Web Archive, a binary distribution. You can put this file directly in your servlet container. The .tar.gz files are source distributions; you need Ant to build the sources.
You can also get the latest sources from the Subversion repository:
svn checkout %SVNSTRATEGOXT%/trunk/experimental/aterm-dbi

See "latest sources" for more information on how to check out packages from the StrategoXT Subversion repository. In the samples package of Stratego Networking you can find some samples in the
xmpl/aterm-dbi
directory (unreleased, use the Subversion repository).
DataSource
that is made available via JNDI by the application server it is hosted in.
Advantages of this approach:
common/lib
directory of your Tomcat installation.
Now it's finally time to install the web archive (.war) of aterm-dbi. You should put this .war in the webapps directory of Tomcat.
We still have to configure Tomcat with the database you want to use.
This is application server and database specific, so you will probably
need to consult some other documentation.
In Tomcat you should add this to the Host element with the name attribute localhost:
<Context path="/aterm-dbi" docBase="aterm-dbi.war" debug="5" reloadable="true">
  <Logger className="org.apache.catalina.logger.FileLogger"
          prefix="aterm-dbi-log." suffix=".txt" timestamp="true"/>
  <Resource name="jdbc/aterm-dbi" auth="Container"
            type="org.postgresql.jdbc3.Jdbc3PoolingDataSource"/>
  <ResourceParams name="jdbc/aterm-dbi">
    <parameter>
      <name>factory</name>
      <value>org.postgresql.jdbc3.Jdbc3ObjectFactory</value>
    </parameter>
    <parameter>
      <name>databaseName</name><value>.....</value>
    </parameter>
    <parameter>
      <name>serverName</name><value>127.0.0.1</value>
    </parameter>
    <parameter>
      <name>user</name><value>....</value>
    </parameter>
    <parameter>
      <name>password</name><value>.....</value>
    </parameter>
  </ResourceParams>
</Context>

Of course you should fill in your database name, server name, user name, and password. We've chosen the
PoolingDataSource
implementation of the PostgreSQL JDBC driver,
but if your application server offers a connection pooling implementation
which interfaces with a ConnectionPoolDataSource
, you should choose that option.
t  := bt            -- basic term
    | bt { t }      -- annotated term
bt := C             -- constant
    | C(t1,...,tn)  -- n-ary constructor
    | (t1,...,tn)   -- n-ary tuple
    | [t1,...,tn]   -- list
    | "ccc"         -- quoted string
    | int           -- integer
    | real          -- floating point number
    | blob          -- binary large object

Here C is a constructor name, which is either an identifier or a quoted string.
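As an illustration of this grammar, here is a small Python reader for the textual format. This is a sketch, not the ATerm library's reader: it covers constants, constructors, tuples, lists, integers, and quoted strings, and omits annotations, reals, and blobs; all helper names are invented for this example:

```python
import re

# Token patterns for the term grammar above (annotations, reals, blobs omitted).
_STR = re.compile(r'"((?:[^"\\]|\\.)*)"')   # escapes are kept verbatim in the result
_INT = re.compile(r'-?\d+')
_NAME = re.compile(r'[A-Za-z][A-Za-z0-9_\-]*')

def parse_aterm(text):
    """Parse a textual ATerm into ints, strs, lists, tuples, and (name, args) pairs."""
    term, _ = _term(text, _skip(text, 0))
    return term

def _term(s, i):
    if s[i] == '[':                       # list: [t1,...,tn]
        return _args(s, i, ']')
    if s[i] == '(':                       # tuple: (t1,...,tn)
        items, j = _args(s, i, ')')
        return tuple(items), j
    if s[i] == '"':                       # quoted string
        m = _STR.match(s, i)
        return m.group(1), m.end()
    m = _INT.match(s, i)                  # integer
    if m:
        return int(m.group()), m.end()
    m = _NAME.match(s, i)                 # constant or constructor application
    if not m:
        raise ValueError('unexpected input at position %d' % i)
    name, j = m.group(), m.end()
    if j < len(s) and s[j] == '(':
        args, j = _args(s, j, ')')
        return (name, args), j
    return (name, []), j                  # constant = constructor with no arguments

def _args(s, i, close):
    """Parse a bracketed, comma-separated sequence; i points at the opening bracket."""
    i = _skip(s, i + 1)
    items = []
    while s[i] != close:
        t, i = _term(s, i)
        items.append(t)
        i = _skip(s, i)
        if s[i] == ',':
            i = _skip(s, i + 1)
    return items, i + 1

def _skip(s, i):
    while i < len(s) and s[i].isspace():
        i += 1
    return i
```

For example, parse_aterm('Row([160, "Meindert Kroese"])') yields the pair ('Row', [[160, 'Meindert Kroese']]).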
baffle
tool in the ATerm Library can be used to convert between the formats.
aterm-java
.
$ aterm2xml -i foo.trm                   # convert to explicit xml
$ aterm2xml -i foo.trm | xml2aterm       # convert to explicit xml and back
$ aterm2xml -i foo.trm --implicit        # convert to implicit xml
$ aterm2xml -i foo.trm --very-explicit   # convert to very explicit xml
ATerm | Explicit XML | Back to ATerm |
---|---|---|
foo | <foo xmlns:at="http://aterm.org"/> | foo |
foo(1) | <foo xmlns:at="http://aterm.org"> <at:int>1</at:int> </foo> | foo(1) |
1 | <at:int xmlns:at="http://aterm.org">1</at:int> | 1 |
"abc" | <at:string xmlns:at="http://aterm.org">abc</at:string> | "abc" |
() | <at:tuple xmlns:at="http://aterm.org"/> | () |
(1, 2) | <at:tuple xmlns:at="http://aterm.org"> <at:int>1</at:int> <at:int>2</at:int> </at:tuple> | (1,2) |
[] | <at:list xmlns:at="http://aterm.org"/> | [] |
[1, 2] | <at:list xmlns:at="http://aterm.org"> <at:int>1</at:int> <at:int>2</at:int> </at:list> | [1,2] |
fred([foo, bar]) | <fred xmlns:at="http://aterm.org"> <at:list> <foo/> <bar/> </at:list> </fred> | fred([foo,bar]) |
fred(None, [foo, bar]) | <fred xmlns:at="http://aterm.org"> <None/> <at:list> <foo/> <bar/> </at:list> </fred> | fred(None,[foo,bar]) |
fred(Some(barney), [foo, bar]) | <fred xmlns:at="http://aterm.org"> <Some> <barney/> </Some> <at:list> <foo/> <bar/> </at:list> </fred> | fred(Some(barney),[foo,bar]) |
fred("foo", "bar") | <fred xmlns:at="http://aterm.org"> <at:string>foo</at:string> <at:string>bar</at:string> </fred> | fred("foo","bar") |
foo{fred} | <foo xmlns:at="http://aterm.org"> <at:anno> <fred/> </at:anno> </foo> | foo{fred} |
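The explicit mapping in the table above is mechanical enough to sketch in a few lines of Python. This is an illustration, not the aterm2xml implementation: terms are represented as ints, strings, lists, tuples, and (name, argument-list) pairs for constructor applications; annotations, pretty-printing whitespace, and XML text escaping are omitted:

```python
AT_NS = ' xmlns:at="http://aterm.org"'

def explicit_xml(term, top=True):
    """Render a term in the explicit XML format; the namespace goes on the root only."""
    ns = AT_NS if top else ''
    if isinstance(term, bool):
        raise ValueError('booleans are not ATerms')
    if isinstance(term, int):
        return '<at:int%s>%d</at:int>' % (ns, term)
    if isinstance(term, str):
        # XML escaping of the text content is omitted in this sketch.
        return '<at:string%s>%s</at:string>' % (ns, term)
    if (isinstance(term, tuple) and len(term) == 2
            and isinstance(term[0], str) and isinstance(term[1], list)):
        # Constructor application C(t1,...,tn); a constant is C with no arguments.
        # Caveat: a genuine 2-tuple (string, list) is indistinguishable here.
        name, args = term
        body = ''.join(explicit_xml(a, False) for a in args)
        return '<%s%s>%s</%s>' % (name, ns, body, name) if body else '<%s%s/>' % (name, ns)
    if isinstance(term, tuple):
        body = ''.join(explicit_xml(a, False) for a in term)
        return '<at:tuple%s>%s</at:tuple>' % (ns, body) if body else '<at:tuple%s/>' % ns
    if isinstance(term, list):
        body = ''.join(explicit_xml(a, False) for a in term)
        return '<at:list%s>%s</at:list>' % (ns, body) if body else '<at:list%s/>' % ns
    raise ValueError('unsupported term: %r' % (term,))
```

Whitespace aside, the outputs coincide with the rows of the table above; for instance explicit_xml(('foo', [1])) yields the foo(1) row.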
ATerm | Very Explicit XML | Back to ATerm |
---|---|---|
foo | <at:appl xmlns:at="http://aterm.org" at:fun="foo"/> | foo |
foo(1) | <at:appl xmlns:at="http://aterm.org" at:fun="foo"> <at:int> <at:value>1</at:value> </at:int> </at:appl> | foo(1) |
1 | <at:int xmlns:at="http://aterm.org"> <at:value>1</at:value> </at:int> | 1 |
"abc" | <at:string xmlns:at="http://aterm.org"> <at:value>abc</at:value> </at:string> | "abc" |
() | <at:tuple xmlns:at="http://aterm.org"/> | () |
(1, 2) | <at:tuple xmlns:at="http://aterm.org"> <at:int> <at:value>1</at:value> </at:int> <at:int> <at:value>2</at:value> </at:int> </at:tuple> | (1,2) |
[] | <at:list xmlns:at="http://aterm.org"/> | [] |
[1, 2] | <at:list xmlns:at="http://aterm.org"> <at:int> <at:value>1</at:value> </at:int> <at:int> <at:value>2</at:value> </at:int> </at:list> | [1,2] |
fred([foo, bar]) | <at:appl xmlns:at="http://aterm.org" at:fun="fred"> <at:list> <at:appl at:fun="foo"/> <at:appl at:fun="bar"/> </at:list> </at:appl> | fred([foo,bar]) |
fred(None, [foo, bar]) | <at:appl xmlns:at="http://aterm.org" at:fun="fred"> <at:appl at:fun="None"/> <at:list> <at:appl at:fun="foo"/> <at:appl at:fun="bar"/> </at:list> </at:appl> | fred(None,[foo,bar]) |
fred(Some(barney), [foo, bar]) | <at:appl xmlns:at="http://aterm.org" at:fun="fred"> <at:appl at:fun="Some"> <at:appl at:fun="barney"/> </at:appl> <at:list> <at:appl at:fun="foo"/> <at:appl at:fun="bar"/> </at:list> </at:appl> | fred(Some(barney),[foo,bar]) |
fred("foo", "bar") | <at:appl xmlns:at="http://aterm.org" at:fun="fred"> <at:string> <at:value>foo</at:value> </at:string> <at:string> <at:value>bar</at:value> </at:string> </at:appl> | fred("foo","bar") |
foo{fred} | <at:appl xmlns:at="http://aterm.org" at:fun="foo"> <at:anno> <at:appl at:fun="fred"/> </at:anno> </at:appl> | foo{fred} |
ATerm | Implicit XML | Back to ATerm |
---|---|---|
foo | <foo xmlns:at="http://aterm.org"/> | foo |
foo(1) | <foo xmlns:at="http://aterm.org">1</foo> | foo("1") |
1 | not possible | not possible |
"abc" | not possible | not possible |
() | not possible | not possible |
(1, 2) | not possible | not possible |
[] | not possible | not possible |
[1, 2] | not possible | not possible |
fred([foo, bar]) | <fred xmlns:at="http://aterm.org"> <foo/> <bar/> </fred> | fred(foo,bar) |
fred(None, [foo, bar]) | <fred xmlns:at="http://aterm.org"> <foo/> <bar/> </fred> | fred(foo,bar) |
fred(Some(barney), [foo, bar]) | <fred xmlns:at="http://aterm.org"> <barney/> <foo/> <bar/> </fred> | fred(barney,foo,bar) |
fred("foo", "bar") | <fred xmlns:at="http://aterm.org">foobar</fred> | fred("foobar") |
foo{fred} | <foo xmlns:at="http://aterm.org"> <at:anno> <fred/> </at:anno> </foo> | foo{fred} |
[ ["\\/", "\\ensuremath{\\vee}"] ]Multiple abbreviation tables can be passed to abox2latex.
component koala-bundle (BUNDLE) { path = koala-bundle } { }

component asfix-tools (ASFIX-TOOLS) { path = koala-bundle/asfix-tools url = file:///home/mdejonge/pkgs } { IbinASFIX-TOOLS : asfix-tools -> koala-bundle/asfix-tools/asfix-tools-1_0 IlibATERM : aterm -> koala-bundle/aterm IlibXTC : xtc -> koala-bundle/stratego IlibSRTS : srts -> koala-bundle/stratego }

module asfix-tools-1_0 { path = koala-bundle/asfix-tools/asfix-tools-1_0 url = file:///home/mdejonge/pkgs } { IlibATERM : aterm -> koala-bundle/asfix-tools IlibXTC : xtc -> koala-bundle/asfix-tools IlibSRTS : srts -> koala-bundle/asfix-tools }

component aterm (ATerm) { path = koala-bundle/aterm url = file:///home/mdejonge/pkgs } { IlibATERM : aterm -> koala-bundle/aterm/aterm-1_0 }

module aterm-1_0 { path = koala-bundle/aterm/aterm-1_0 url = file:///home/mdejonge/pkgs } { }

component gpp (GPP) { path = koala-bundle/gpp url = file:///home/mdejonge/pkgs } { IbinGPP : gpp -> koala-bundle/gpp/gpp-1_0 IlibATERM : aterm -> koala-bundle/aterm IbinASFIX-TOOLS : asfix-tools -> koala-bundle/asfix-tools IbinSGLR : sglr -> koala-bundle/sglr IbinGRAPH-TOOLS : graph-tools -> koala-bundle/graph-tools IlibSRTS : srts -> koala-bundle/stratego IlibXTC : xtc -> koala-bundle/stratego }

module gpp-1_0 { path = koala-bundle/gpp/gpp-1_0 url = file:///home/mdejonge/pkgs } { IlibATERM : aterm -> koala-bundle/gpp IbinASFIX-TOOLS : asfix-tools -> koala-bundle/gpp IbinSGLR : sglr -> koala-bundle/gpp IbinGRAPH-TOOLS : graph-tools -> koala-bundle/gpp IlibSRTS : srts -> koala-bundle/gpp IlibXTC : xtc -> koala-bundle/gpp }

component graph-tools (GRAPH-TOOLS) { path = koala-bundle/graph-tools url = file:///home/mdejonge/pkgs } { IbinGRAPH-TOOLS : graph-tools -> koala-bundle/graph-tools/graph-tools-1_0 IlibATERM : aterm -> koala-bundle/aterm IlibSRTS : srts -> koala-bundle/stratego IlibXTC : xtc -> koala-bundle/stratego }

module graph-tools-1_0 { path = koala-bundle/graph-tools/graph-tools-1_0 url = file:///home/mdejonge/pkgs } { IlibATERM : aterm -> koala-bundle/graph-tools IlibXTC : xtc -> koala-bundle/graph-tools IlibSRTS : srts -> koala-bundle/graph-tools }

component sglr (mySGLR) { path = koala-bundle/sglr url = file:///home/mdejonge/pkgs } { IbinSGLR : sglr -> koala-bundle/sglr/sglr-1_0 IlibPT-SUPPORT : pt-support -> koala-bundle/sglr/pt-support-1_0 IlibTOOLBUS-LIB : toolbus-lib -> koala-bundle/sglr/toolbus-lib-1_0 IlibATERM : aterm -> koala-bundle/aterm }

module sglr-1_0 { path = koala-bundle/sglr/sglr-1_0 url = file:///home/mdejonge/pkgs } { IlibPT-SUPPORT : pt-support -> koala-bundle/sglr IlibTOOLBUS-LIB : toolbus-lib -> koala-bundle/sglr IlibATERM : aterm -> koala-bundle/sglr }

module pt-support-1_0 { path = koala-bundle/sglr/pt-support-1_0 url = file:///home/mdejonge/pkgs } { IlibTOOLBUS-LIB : toolbus-lib -> koala-bundle/sglr IlibATERM : aterm -> koala-bundle/sglr }

module toolbus-lib-1_0 { path = koala-bundle/sglr/toolbus-lib-1_0 url = file:///home/mdejonge/pkgs } { IlibATERM : aterm -> koala-bundle/sglr }

component stratego (Stratego) { path = koala-bundle/stratego } { IlibSRTS : srts -> koala-bundle/stratego/my_srts IlibXTC : xtc -> koala-bundle/stratego/my_xtc IlibATERM : aterm -> koala-bundle/aterm }

component my_xtc (XTC) { path = koala-bundle/stratego/my_xtc url = file:///home/mdejonge/pkgs } { IlibXTC : xtc -> koala-bundle/stratego/my_xtc/xtc-1_0 IlibATERM : aterm -> koala-bundle/stratego IlibSRTS : srts -> koala-bundle/stratego }

module xtc-1_0 { path = koala-bundle/stratego/my_xtc/xtc-1_0 url = file:///home/mdejonge/pkgs } { IlibATERM : aterm -> koala-bundle/stratego/my_xtc IlibSRTS : srts -> koala-bundle/stratego/my_xtc }

component my_srts (SRTS) { path = koala-bundle/stratego/my_srts url = file:///home/mdejonge/pkgs } { IlibSRTS : srts -> koala-bundle/stratego/my_srts/srts-1_0 IlibATERM : aterm -> koala-bundle/stratego }

module srts-1_0 { path = koala-bundle/stratego/my_srts/srts-1_0 url = file:///home/mdejonge/pkgs } { IlibATERM : aterm -> koala-bundle/stratego/my_srts }

component koala (KOALA) { path = koala-bundle/koala url = file:///home/mdejonge/pkgs } { IbinKOALA : koala -> koala-bundle/koala/koala-1_0 IbinGPP : gpp -> koala-bundle/gpp IbinSGLR : sglr -> koala-bundle/sglr IlibSRTS : srts -> koala-bundle/stratego IlibXTC : xtc -> koala-bundle/stratego IlibATERM : aterm -> koala-bundle/aterm }

module koala-1_0 { path = koala-bundle/koala/koala-1_0 url = file:///home/mdejonge/pkgs } { IbinGPP : gpp -> koala-bundle/koala IbinSGLR : sglr -> koala-bundle/koala IlibSRTS : srts -> koala-bundle/koala IlibXTC : xtc -> koala-bundle/koala IlibATERM : aterm -> koala-bundle/koala }
definition
module Main
exports
  context-free start-symbols Stm*
  sorts Id IntConst
  lexical syntax
    [\ \t\n] -> LAYOUT
    [a-zA-Z]+ -> Id
    [0-9]+ -> IntConst
  sorts Stm
  context-free syntax
    Id ":=" Exp -> Stm {cons("Assign")}
  sorts Exp
  context-free syntax
    Id -> Exp {cons("Var")}
    IntConst -> Exp {cons("Int")}
    Exp "+" Exp -> Exp {left, cons("Plus")}

Generate a parse table:
> sdf2table -i Foo.def -o Foo.tbl

Input file:
x := 5
y := x + 4
z := x + y

Parse the input and have a look at the result in abstract syntax:
> sglr -i foo.txt -p Foo.tbl | addPosInfo -p ./foo.txt -m | implodePT | pp-aterm

Notes:
-p
) of the input file.
-2
argument.
ambtracker offers an alternative visualization of ambiguities in a parse tree: it displays only the productions causing the ambiguity, not how these productions are applied to the actual input. The location (row and column) in the input where an ambiguity occurs is also displayed for easy reference.
Consider the syntax definition that is used to illustrate visamb.
definition
module Main
exports
  sorts Exp
  lexical syntax
    [\ \t\n] -> LAYOUT
  context-free syntax
    "id" -> Exp
    Exp Exp -> Exp

From this syntax definition an SGLR parse table can be generated:
sdf2table -i Exp.sdf -o Exp.tbl

The ambiguities of the phrase
id id id
can be shown with:
echo "id id id" | sglr -2 -p Exp.tbl | ambtrackerthe output of this command is:
1 ambiguity cluster:
[1/1] at (1:0):
  Exp Exp -> Exp
  Exp Exp -> Exp
-2
flag is used.
In StrategoXT, program transformation systems usually operate on abstract syntax trees. The AsFix2 output of SGLR is transformed into an abstract syntax tree by implode-asfix.
definition
module Exp
exports
  sorts Exp
  lexical syntax
    [\ \t\n] -> LAYOUT
    [a-zA-Z]+ -> Id
    [0-9]+ -> IntConst
  context-free syntax
    Id -> Exp {cons("Var")}
    IntConst -> Exp {cons("Int")}
    Exp "*" Exp -> Exp {left, cons("Mul")}
    Exp "/" Exp -> Exp {left, cons("Div")}
    Exp "%" Exp -> Exp {left, cons("Mod")}
    Exp "+" Exp -> Exp {left, cons("Plus")}
    Exp "-" Exp -> Exp {left, cons("Minus")}
  context-free priorities
    {left:
      Exp "*" Exp -> Exp
      Exp "/" Exp -> Exp
      Exp "%" Exp -> Exp
    } > {left:
      Exp "+" Exp -> Exp
      Exp "-" Exp -> Exp
    }

PGEN's
sdf2table
produces an SGLR parse table from this syntax definition:
sdf2table -m Exp -i Exp.def -o Exp.tbl
Parsing the input

1 + a

to the compact AsFix variant AsFix2ME using:

echo "1 + a" | sglr -m -A -p Exp.tbl | pp-aterm

produces the following parse tree:
parsetree( appl( prod( [cf(opt(layout)), cf(sort("Exp")), cf(opt(layout))] , sort("<START>") , no-attrs ) , [ appl(prod([], cf(opt(layout)), no-attrs), []) , appl( prod( [ cf(sort("Exp")) , cf(opt(layout)) , lit("+") , cf(opt(layout)) , cf(sort("Exp")) ] , cf(sort("Exp")) , attrs([assoc(left), term(cons("Plus"))]) ) , [ appl( prod( [cf(sort("IntConst"))] , cf(sort("Exp")) , attrs([term(cons("Int"))]) ) , [ appl( prod( [lex(sort("IntConst"))] , cf(sort("IntConst")) , no-attrs ) , [ appl( list(iter-star(char-class([range(0, 255)]))) , [49] ) ] ) ] ) , appl( prod([cf(layout)], cf(opt(layout)), no-attrs) , [32] ) , lit("+") , appl( prod([cf(layout)], cf(opt(layout)), no-attrs) , [32] ) , appl( prod( [cf(sort("Id"))] , cf(sort("Exp")) , attrs([term(cons("Var"))]) ) , [ appl( prod( [lex(sort("Id"))] , cf(sort("Id")) , no-attrs ) , [ appl( list(iter-star(char-class([range(0, 255)]))) , [97] ) ] ) ] ) ] ) , appl( prod([cf(layout)], cf(opt(layout)), no-attrs) , [10] ) ] ) , 0 )
echo "1 + a" | sglr -2 -A -p Exp.tbl | pp-atermresults in:
parsetree( appl( prod( [cf(opt(layout)), cf(sort("Exp")), cf(opt(layout))] , sort("<START>") , no-attrs ) , [ appl(prod([], cf(opt(layout)), no-attrs), []) , appl( prod( [ cf(sort("Exp")) , cf(opt(layout)) , lit("+") , cf(opt(layout)) , cf(sort("Exp")) ] , cf(sort("Exp")) , attrs([assoc(left), term(cons("Plus"))]) ) , [ appl( prod( [cf(sort("IntConst"))] , cf(sort("Exp")) , attrs([term(cons("Int"))]) ) , [ appl( prod( [lex(sort("IntConst"))] , cf(sort("IntConst")) , no-attrs ) , [ appl( prod( [lex(iter(char-class([range(48, 57)])))] , lex(sort("IntConst")) , no-attrs ) , [ appl( prod( [char-class([range(48, 57)])] , lex(iter(char-class([range(48, 57)]))) , no-attrs ) , [49] ) ] ) ] ) ] ) , appl( prod([cf(layout)], cf(opt(layout)), no-attrs) , [ appl( prod([lex(layout)], cf(layout), no-attrs) , [ appl( prod( [char-class([range(9, 10), 32])] , lex(layout) , no-attrs ) , [32] ) ] ) ] ) , appl( prod([char-class([43])], lit("+"), no-attrs) , [43] ) , appl( prod([cf(layout)], cf(opt(layout)), no-attrs) , [ appl( prod([lex(layout)], cf(layout), no-attrs) , [ appl( prod( [char-class([range(9, 10), 32])] , lex(layout) , no-attrs ) , [32] ) ] ) ] ) , appl( prod( [cf(sort("Id"))] , cf(sort("Exp")) , attrs([term(cons("Var"))]) ) , [ appl( prod( [lex(sort("Id"))] , cf(sort("Id")) , no-attrs ) , [ appl( prod( [ lex( iter( char-class([range(65, 90), range(97, 122)]) ) ) ] , lex(sort("Id")) , no-attrs ) , [ appl( prod( [char-class([range(65, 90), range(97, 122)])] , lex( iter( char-class([range(65, 90), range(97, 122)]) ) ) , no-attrs ) , [97] ) ] ) ] ) ] ) ] ) , appl( prod([cf(layout)], cf(opt(layout)), no-attrs) , [ appl( prod([lex(layout)], cf(layout), no-attrs) , [ appl( prod( [char-class([range(9, 10), 32])] , lex(layout) , no-attrs ) , [10] ) ] ) ] ) ] ) , 0 )
echo "1 + a" | sglr -2A -p Exp.tbl | implode-asfix | pp-atermresults in:
Plus(Int("1"), Var("a"))implode-asfix only accepts AsFix2. The implodePT (part of the pt-support package, which is in the sdf2-bundle) implements the same implosion for AsFix2ME.
echo "1 + a" | sglr -mA -p Exp.tbl | implodePT | pp-atermproduces
Plus(Int("1"), Var("a"))
definition
module Exp
exports
  sorts Exp
  context-free syntax
    "nil" -> Exp
  lexical syntax
    [\t\n\r\ ] -> LAYOUT
appl( prod([lit("nil")], cf(sort("Exp")), no-attrs) , [lit("nil")] )
appl( prod([lit("nil")], cf(sort("Exp")), no-attrs) , [ appl( prod( [char-class([110]), char-class([105]), char-class([108])] , lit("nil") , no-attrs ) , [110, 105, 108] ) ] )
definition
module Exp
hiddens
  sorts IntConst
exports
  sorts Exp
  lexical syntax
    [\ \t\n] -> LAYOUT
    [0-9]+ -> IntConst
  context-free syntax
    IntConst -> Exp {cons("Int")}
sdf2table -i Exp-tiny.def -o Exp-tiny.tbl -m Exp

Stratego script to extract the subtree related to the context-free symbol IntConst:
import simple-traversal ;;
oncetd(?appl(prod(_, cf(sort("IntConst")), _), _); ?x) ;;
!x ;;

Parse the expression "12" to AsFix2:
echo "1" | sglr -2 -p Exp-tiny.tbl | stratego-shell --script extract-lex.srts | pp-aterm
appl( prod( [lex(sort("IntConst"))] , cf(sort("IntConst")) , no-attrs ) , [ appl( prod( [lex(iter(char-class([range(48, 57)])))] , lex(sort("IntConst")) , no-attrs ) , [ appl( prod( [ lex(iter(char-class([range(48, 57)]))) , lex(iter(char-class([range(48, 57)]))) ] , lex(iter(char-class([range(48, 57)]))) , attrs([assoc(left)]) ) , [ appl( prod( [char-class([range(48, 57)])] , lex(iter(char-class([range(48, 57)]))) , no-attrs ) , [49] ) , appl( prod( [char-class([range(48, 57)])] , lex(iter(char-class([range(48, 57)]))) , no-attrs ) , [50] ) ] ) ] ) ] )
echo "12" | sglr -m -p Exp-tiny.tbl | stratego-shell --script extract-cf.strs | pp-aterm
appl( prod( [lex(sort("IntConst"))] , cf(sort("IntConst")) , no-attrs ) , [ appl( list(iter-star(char-class([range(0, 255)]))) , [49, 50] ) ] )
definition
module Exp
hiddens
  sorts DecConst
exports
  sorts Exp
  lexical syntax
    [\ \t\n] -> LAYOUT
    [0-9]+ "." [0-9]+ -> DecConst {cons("DecConst")}
  context-free syntax
    DecConst -> Exp
echo "13.25" | sglr -m -p Exp-tiny.tbl | stratego-shell --script extract-cf.strs | pp-aterm
appl( prod( [lex(sort("DecConst"))] , cf(sort("DecConst")) , no-attrs ) , [ appl( list(iter-star(char-class([range(0, 255)]))) , [49, 51, 46, 50, 53] ) ] )
definition
module Exp
hiddens
  sorts DecConst
exports
  sorts Exp
  context-free syntax
    DecConst -> Exp
  syntax
    [0-9]+ "." [0-9]+ -> DecConst {cons("DecConst")}
    DecConst -> <DecConst-CF>
  lexical syntax
    [\t\n\r\ ] -> LAYOUT
/**
 * Voodoo
 */
class Voodoo {
  /**
   * Bla bla
   */
  public void foo(/*let me explain this */ int x) {
    // just return
    return /* foo */ x;
  }
}

An example fragment of the AST:
Param([], Int, Id("x")){(Comment, "/*let me explain this */")}

The JavaFront pretty-printer has been extended to support these Comment annotations. The following pipe:
sglr -2 -s CompilationUnit -p ~/wc/java-front/syn/v1.5/Java-15.tbl -i ~/Foo.java | ./asfix-anno-comments | implode-asfix | pp-java

produces the following output:
/**
 * Voodoo
 */
class Voodoo {
  /**
   * Bla bla
   */
  public void foo(/*let me explain this */ int x) {
    // just return
    return /* foo */ x;
  }
}
myplatform="localhost"

in your platforms file.
-----------------------------------------------------------------------------
 RECOVERY OF SYNTAX DEFINITION FOR LEX
-----------------------------------------------------------------------------

No syntax definition for LEX was available in the grammar-base. In order to
further automate the translation of LEX/YACC grammars to SDF2 syntax
definitions, such a syntax definition is needed. In this file I report the
steps I took to get LEX in SDF2. -- Eelco Visser 2001/09/29

-----------------------------------------------------------------------------

[Step 1] Locate the sources

  ftp://ftp.gnu.org/non-gnu/flex/

[Step 2] Inspect the source

  cd flex-2.5.4
  less parse.y

[Step 3] Copy source to grammar base

  cp parse.y ~/res/XT/gb/grammars/lex.0

[Step 3] Parse the YACC (Bison) source

  > parse -l yacc -i lex.y -I -o lex.af

  => Error: charliteral '\n' not recognized. Repair syntax definition of YACC.

[Step 4] Translate to AbstractSDF

  > parse -l yacc -i lex.y -I -o lex.af
    yacc2sdf -i lex.af -o lex.asdf

[Step 5] Pretty-print syntax definition and inspect

  > sdf-bracket -i lex.asdf | pp -a -l sdf -o lex.def -v 2.1
    less lex.def

[Step 6] Regularize the syntax definition

  > parse -l yacc -i lex.y -I -o lex.af
    yacc2sdf -i lex.af -o lex.asdf
    sdf-regularize -i lex.asdf -o lex.reg.asdf
    sdf-bracket -i lex.reg.asdf | pp -a -l sdf -o lex.def -v 2.1
    less lex.def

[Step 7] Generate constructors

  > parse -l yacc -i lex.y -I -o lex.af
    yacc2sdf -i lex.af -o lex.asdf
    sdf-regularize -i lex.asdf -o lex.reg.asdf
    sdf-cons -i lex.reg.asdf -o lex.reg.cons.asdf
    sdf-bracket -i lex.reg.cons.asdf | pp -a -l sdf -o lex.def -v 2.1
    less lex.def

[Step 8] Edit the definition to define lexical syntax and improve constructors

[Step 9] Unpack the definition to create separate SDF modules. Make check can
then be used to generate lex.def and automatically parse various example
files. Before doing this change the names of the modules Lexical and
Generated into Lex-Symbols and Lex, respectively.
  > unpack-sdf lex.def

[Step 10] Further edit the modules to define lexical syntax.

  => It turns out to be rather hard to parse complete lex files. The lexical
     syntax is very tricky. Instead I decide to reduce the problem by editing
     lex files by hand to remove all irrelevant stuff and only leave
     definitions of the form |name re| and rules of the form |re return id;|.
     Newlines cannot be used as general layout, but are used to delimit
     definitions and rules. No superfluous newlines are allowed.

  => Succeed in parsing stratego.mod.l, a modified version of the stratego
     lexical syntax in lex.

[Step 11] Improve the syntax definition to get good abstract syntax. Start
with unfolding literals.

  > parse -l sdf -v 2.1 -I -i lex.def -o lex.adef
    unfold-literal -i lex.adef -o lex.unf.adef
    sdf-bracket -i lex.adef | pp -a -l sdf -o lex.def -v 2.1
    less lex.def

  => Automatic application does not work, do it manually. (come back and
     repair unfold-literal later)

[Step 12] Abstract syntax looks good. Install parse table such that it can be
used with the parse tool of the grammar base.

  > make install
    parse -l lex -i data/stratego.mod.l -I

[Step 13] Project finished.

  => Future work:
     - parse full lex definitions
     - generate a signature from the syntax definition
----------------------------------------------------------------------
 RECOVERING A SYNTAX DEFINITION FOR STRATEGO
----------------------------------------------------------------------

This directory contains a syntax definition in SDF2 of the Stratego language.
This file describes step by step how the syntax definition was obtained from
the YACC source in the source tree of the Stratego Compiler SC. Including the
search for tools, their usage, repair if necessary and the occasional
implementation of a missing tool. -- Eelco Visser 1/10/2001

----------------------------------------------------------------------

[step 1] Copy the YACC file

  > cp ../../sc/spec/syn/stratego.grm .

[step 2] Parse the YACC file

  > parse -l yacc -i stratego.grm -I -o stratego.af

[step 3] Translate YACC to SDF

  > yacc2sdf -i stratego.af -o stratego.asdf

[step 4] Pretty-print the syntax definition

  > pp -l sdf -i stratego.asdf -o stratego.def

[step 4a] Find out what is wrong

  > pp -h

  => -a switch should be used to indicate that input is abstract syntax

[step 4c] Pretty-print the syntax definition with -a

  > pp -a -l sdf -i stratego.asdf -o stratego.def

  No pp entry found for: ["Definition"]
  rewriting failed

  => wrong pretty-print table

[step 4d] Pretty-print the syntax definition with -a and sdf version 2.1

  pp -a -l sdf -i stratego.asdf -o stratego.def -v 2.1

[step 5] Inspecting the generated syntax definition

  > less stratego.def

  => No templates for lexicals have been included. I remember that these were
     generated automatically from the %token declarations in the YACC file.

  => Yes, yacc2sdf is broken; the signature of trees produced by the parse
     has been changed. Repair this.

  => Problem was due to change in generation of constructors by sdf-cons.
     Adapted Yacc syntax definition; inlined injection Nmno* -> Nlist.
[Step 6] Repeat steps 1 - 5

  > parse -l yacc -i stratego.grm -I -o stratego.af
    yacc2sdf -i stratego.af -o stratego.asdf
    pp -a -l sdf -i stratego.asdf -o stratego.def -v 2.1
    less stratego.def

  => Each template rule for token is put in its own lexical syntax section;
     merge these. Adapt yacc2sdf

[Step 7] Repeat step 6

  > parse -l yacc -i stratego.grm -I -o stratego.af
    yacc2sdf -i stratego.af -o stratego.asdf
    pp -a -l sdf -i stratego.asdf -o stratego.def -v 2.1
    less stratego.def

  => Lexicals are now merged

  => No list detection and transformation is done by yacc2sdf. Which tool
     does achieve that? There used to be a deyaccification tool, which I can
     no longer find. Write a new tool: sdf-regularize which achieves this.
     Add it to the sdf-tools package.

  => sdf-regularize recognizes various kinds of lists and optional constructs
     and translates them into regular expressions. The sorts representing
     these regular expressions are then superfluous. By inlining their
     definitions the sorts are removed and the syntax definition is
     shortened.

[Step 8] Add application of sdf-regularize and sdf-bracket

  > parse -l yacc -i stratego.grm -I -o stratego.af
    yacc2sdf -i stratego.af -o stratego.asdf
    sdf-regularize -i stratego.asdf -o stratego.reg.asdf
    sdf-bracket -i stratego.reg.asdf | pp -a -l sdf -o stratego.def -v 2.1
    less stratego.def

  => The syntax definition looks good now, with much fewer productions. No
     definition for lexical syntax exists yet, however.

[Step 9] Generate constructor annotations

  > parse -l yacc -i stratego.grm -I -o stratego.af
    yacc2sdf -i stratego.af -o stratego.asdf
    sdf-regularize -i stratego.asdf -o stratego.reg.asdf
    sdf-cons -i stratego.reg.asdf -o stratego.reg.cons.asdf
    sdf-bracket -i stratego.reg.cons.asdf | pp -a -l sdf -o stratego.def -v 2.1
    less stratego.def

  => This produces bad results, since the heuristics are based on the
     literals in the productions; we need to add in the literals. Can this be
     done automatically?
[Step 10] Find the LEX file

  > cp ../../sc/spec/syn/stratego.lx .

  => This looks like it could be translated into SDF2 mostly automatically.
     Let's find a Lex grammar. There is none in the grammar base. I guess
     we'll have to reverse engineer it.

  => Created a syntax definition for a subset of LEX that can deal with
     stripped files that only contain definitions and rules, but not
     arbitrary C code. The file stratego.mod.l contains the stripped off lex
     file for stratego.

  => The syntax definition for lex has been installed in the grammar base. We
     can now use the parse tool to parse stratego.mod.l

[Step 11] Parse the LEX file

  > parse -l lex -i stratego.mod.l -o stratego.mod.af -I

[Step 12] Translating LEX to SDF2

  => There is no tool for this yet. We'll have to write it.

  => lex2sdf is new tool, implemented in grammar-recovery/src/yacc2sdf/

  > parse -l lex -i stratego.mod.l -o stratego.mod.af -I
    lex2sdf -i stratego.mod.af -o stratego.mod.asdf
    sdf-bracket -i stratego.mod.asdf | pp -a -l sdf -o stratego.mod.sdf -v 2.1
    less stratego.mod.sdf

  => This provides a good lexical syntax. Note that layout is missing.

[Step 13] Recapitulation. The following actions were taken to derive the SDF2
definition so far.
Note that the source file names have been renamed

  > mv stratego.grm stratego.y
    mv stratego.mod.l stratego.l

  => Context-free syntax

  > parse -l yacc -i stratego.y -I -o stratego-cfg.af
    yacc2sdf -i stratego-cfg.af -o stratego-cfg.asdf
    sdf-regularize -i stratego-cfg.asdf -o stratego-cfg.reg.asdf
    sdf-bracket -i stratego-cfg.reg.asdf | pp -a -l sdf -o stratego-cfg.def -v 2.1
    less stratego-cfg.def

  => Lexical syntax

  > parse -l lex -i stratego.l -o stratego-lex.af -I
    lex2sdf -i stratego-lex.af -o stratego-lex.asdf
    sdf-bracket -i stratego-lex.asdf | pp -a -l sdf -o stratego-lex.def -v 2.1
    less stratego-lex.def

[Step 14] Combine lexical and context-free syntax

  => edit: remove module Lexical from stratego-cfg.def

  > unpack-sdf stratego-cfg.def
    unpack-sdf stratego-lex.def

  => edit: repair error in definition of Backslash

  > pack-sdf -i Main.sdf -I ./. -dep stratego.af

[Step 14] Unfold literals; replace token names by their definitions.

  > pack-sdf -i Main.sdf -I ./. -dep stratego.af -o stratego.af
    pp -A -l sdf -i stratego.af -o stratego.def -v 2.1

  > parse -l sdf -v 2.1 -i stratego.def -o stratego.af -I
    unfold-literal -i stratego.af -o stratego.unf.af
    sdf-bracket -i stratego.unf.af | pp -a -l sdf -o stratego.unf.def -v 2.1
    less stratego.unf.def

  => unfold-literals did not work for all cases. Rewrote the tool using
     dynamic rules, which made the specification much shorter.
  => Use the result as the new Lexical.sdf and Stratego.sdf modules

  > unpack-sdf stratego.unf.def

[Step 15] Clean up Lexical.sdf and Stratego.sdf manually

  => Fill in missing lexical syntax
     - layout definitions (in particular definition of literate comments)

  => Improve abstract syntax
     - unfold Optvarlist in StrategyDef
     - observations: often better to unfold optionals if the optional can
       also be expressed in terms of the
     - example: the condition |where id| is equivalent to a rule without
       condition
     - example: the strategy definition |f = s| is equivalent to |f() = s|,
       i.e., with an empty list of arguments
     - therefore the latter can be desugared into the former allowing uniform
       treatment without having to deal with None and Some constructors.
     - this is at the cost of extra productions, however, the *core* syntax
       is smaller

  => Add constructors

  > make check

  => Stratego files get parsed correctly it seems

[Step 16] Compatibility with parse-mod. To use the parser based on the SDF2
definition it should be compatible with the existing YACC based parser.

  => Set up a test script that checks compatibility between SDF and YACC
     based parsers

  => In order to achieve this abstract syntax trees should be desugared.

  => In order to implement a desugarer we need the signature of trees
     produced by the new parser.

[Step 17] Derive the signature of Stratego from the syntax definition

  > parse -l sdf -v 2.1 -I -i stratego.def -o stratego.adef
    sdf2sig -i stratego.adef -o stratego.ar
    ast2abox -p /home/visser/res/app/tiger/tmp/front-tiger-0.2/sig/stratego.pp -i stratego.ar -o stratego.ar.abox
    abox2text -i stratego.ar.abox -o stratego.r
    unpack stratego.r r
    less stratego.r

  => Inspecting the signature unveiled a couple of wrong constructor names
     (forgot about Var and SVar)

[Step 18] Added reject rules for reserved words

  => The performance of generated parsers was not very good and that time was
     spent filtering syntax trees. It occurred to me that I had not declared
     the reserved words of Stratego. Added productions of the form
     |"keyword" -> Id {reject}| for all reserved words.

  => Module names used in import sections are allowed to be reserved words
     except for "rules", "strategies", "signature", and "overlays". Created
     separate lexical sort ModName to reflect this.

[Step 19] Add stratego.0.6.2 to the grammar base.

[Step 20] Project finished. Future work:

  => Improve the pretty-print table to obtain a beautifier for Stratego
     specifications.
gen-renamed-sdf-module
generates an SDF module that renames all SDF sorts in a given SDF definition.
definition
module Expressions
imports Identifiers [Id => MyId]
exports
  sorts Exp
  context-free syntax
    Id -> Exp {cons("Var")}
    IntConst -> Exp {cons("Int")}
    Exp "+" Exp -> Exp {left, cons("Plus")}
  lexical syntax
    [\ \t\n] -> LAYOUT
    [0-9]+ -> IntConst

module Identifiers
exports
  sorts Id
  lexical syntax
    [a-zA-Z]+ -> Id

Invocation:
> gen-renamed-sdf-module -i Exp.def -m Expressions --name Exp-Prefixed --prefix Exp

Result:
module Exp-Prefixed
imports Expressions
  [ IntConst => ExpIntConst
    MyId => ExpMyId
    Exp => ExpExp
  ]
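The renaming such a generated module performs is a simple prefixing of every sort. The following Python sketch illustrates the idea (the function names are hypothetical; this is not the tool's implementation):

```python
def rename_sorts(sorts, prefix):
    """Build the renaming map gen-renamed-sdf-module emits: Sort => PrefixSort."""
    return {s: prefix + s for s in sorts}

def renamed_module(name, imported, renaming):
    """Render a one-line SDF module that applies the renaming to an imported module."""
    pairs = ' '.join('%s => %s' % kv for kv in renaming.items())
    return 'module %s imports %s [ %s ]' % (name, imported, pairs)
```

Applied to the sorts of the example above, rename_sorts(['IntConst', 'MyId', 'Exp'], 'Exp') yields the same IntConst => ExpIntConst, MyId => ExpMyId, Exp => ExpExp renaming shown in the result module.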